Brian E. J. Rose, University at Albany
This document uses the interactive IPython notebook format (now also called Jupyter). The notes can be accessed in several different ways:
github at https://github.com/brian-rose/ClimateModeling_courseware
Many of these notes make use of the climlab package, available at https://github.com/brian-rose/climlab
In this assignment you will investigate how the CESM slab ocean model responds to a doubling of atmospheric CO2.
Please name your notebook [your last name].ipynb, e.g. my notebook should be called Rose.ipynb.
Here we investigate differences between the control simulation and the 2xCO2 simulation (after it has reached its new, warmer equilibrium).
What are the clear-sky and cloudy-sky components of those changes?
Make well-labeled maps of the change in the annual mean of these five quantities:
Comment on what you found in your maps.
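One common way to separate clear-sky and cloudy-sky components is through the cloud radiative effect (CRE): the difference between all-sky and clear-sky top-of-atmosphere fluxes. The sketch below assumes the standard CESM history variable names FSNT/FSNTC (net shortwave, all-sky/clear-sky) and FLNT/FLNTC (outgoing longwave, all-sky/clear-sky); check the variables actually present in your output files. The tiny arrays stand in for real model maps.

```python
import numpy as np

def cloud_radiative_effect(FSNT, FSNTC, FLNT, FLNTC):
    """Return (SW CRE, LW CRE, net CRE) in W/m2 from TOA fluxes."""
    sw_cre = FSNT - FSNTC   # clouds reduce absorbed shortwave -> usually negative
    lw_cre = FLNTC - FLNT   # clouds reduce outgoing longwave -> usually positive
    return sw_cre, lw_cre, sw_cre + lw_cre

# Synthetic 2x2 "maps" (W/m2), just to exercise the function
FSNT  = np.array([[240., 250.], [230., 245.]])
FSNTC = np.array([[290., 295.], [285., 292.]])
FLNT  = np.array([[235., 240.], [228., 238.]])
FLNTC = np.array([[260., 262.], [255., 261.]])

sw, lw, net = cloud_radiative_effect(FSNT, FSNTC, FLNT, FLNTC)
```

Applying this to the annual means of both runs and differencing gives the change in the cloudy-sky component; the change in the clear-sky fluxes themselves gives the clear-sky component.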
Here we investigate the transient adjustment to equilibrium. For this, we will use the file som_2xCO2.cam.h0.global.nc
This file contains a monthly timeseries of the CESM model output from the 2xCO2 model run, which was initialized from the control run. Every variable in this file has already been averaged globally. We can use the timeseries to look at the adjustment of the global average temperature and energy budget to the new equilibrium.
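The qualitative behavior we expect to see in that timeseries can be mimicked with a zero-dimensional energy balance model, C dT'/dt = F - λT', where T' is the global mean temperature anomaly, F the 2xCO2 forcing, λ the net feedback parameter, and C the slab-ocean heat capacity. This is a sketch for intuition only, not the CESM model; all parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative (assumed) parameters:
F = 4.0      # W m-2, approximate 2xCO2 radiative forcing
lam = 1.3    # W m-2 K-1, net climate feedback parameter
C = 4.2e8    # J m-2 K-1, heat capacity of roughly a 100 m water column

dt = 30 * 24 * 3600       # one "month" timestep in seconds
nsteps = 720              # 60 years of monthly steps
T = np.zeros(nsteps + 1)  # anomaly relative to the control climate
for i in range(nsteps):
    # forward Euler step of C dT/dt = F - lam * T
    T[i + 1] = T[i] + dt * (F - lam * T[i]) / C

Teq = F / lam  # equilibrium warming the anomaly relaxes toward (~3.1 K here)
```

The e-folding adjustment timescale is C/λ (about a decade for these numbers), which is why the global mean temperature in the slab-ocean run takes years of simulation to level off at its new equilibrium.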
Here we will study the annual cycle in global mean surface temperature and verify it against observations. For the observations, we will use the NCEP Reanalysis data.
Reanalysis data is really a blend of observations and output from numerical weather prediction models. It represents our “best guess” at conditions over the whole globe, including regions where observations are very sparse.
The necessary data are all served over the internet. We will look at monthly climatologies averaged over the 30-year period 1981-2010.
The data catalog is here, please feel free to browse: http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis.derived/catalog.html
Surface air temperature is contained in a file called air.2m.mon.1981-2010.ltm.nc, which is found in the directory surface_gauss.
Here's a link directly to the catalog page for this data file: http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis.derived/surface_gauss/catalog.html?dataset=Datasets/ncep.reanalysis.derived/surface_gauss/air.2m.mon.1981-2010.ltm.nc
Now click on the OPeNDAP link. A page opens up with lots of information about the contents of the file. The Data URL is what we need to read the data into our Python session. For example, this code opens the file and displays a list of the variables it contains:
import netCDF4 as nc
ncep_air2m = nc.Dataset("http://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/ncep.reanalysis.derived/surface_gauss/air.2m.mon.1981-2010.ltm.nc")
for v in ncep_air2m.variables: print(v)
The temperature data is called air. Take a look at the details:
print(ncep_air2m.variables['air'])
Notice that the dimensions are (12, 94, 192) -- meaning 12 months, 94 latitude points, 192 longitude points. Not the same grid as our model output!
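Because the grids differ, the simplest like-for-like comparison of the annual cycle is the area-weighted global mean computed separately on each grid, with weights proportional to the cosine of latitude. The helper below (`global_mean` is a hypothetical name, not part of any library) is exercised on synthetic data; with the real file you would pass `ncep_air2m.variables['air'][:]` and the `lat` coordinate instead.

```python
import numpy as np

def global_mean(field, lat):
    """Area-weighted mean of field(..., lat, lon) using cos(lat) weights."""
    w = np.cos(np.deg2rad(lat))
    zonal = field.mean(axis=-1)             # average over longitude first
    return (zonal * w).sum(axis=-1) / w.sum()

# Synthetic stand-ins for the NCEP grid (94 lats, 192 lons, 12 months)
lat = np.linspace(-88.5, 88.5, 94)
field = np.full((12, 94, 192), 288.0)       # uniform 288 K test field
monthly_means = global_mean(field, lat)     # twelve values, each ~288.0
```

A uniform field should return its own value for every month, and an equator-to-pole antisymmetric field should average to zero; both are quick sanity checks before trusting the weighting on real data.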
The author of this notebook is Brian E. J. Rose, University at Albany.
It was developed in support of ATM 623: Climate Modeling, a graduate-level course in the Department of Atmospheric and Environmental Sciences, offered in Spring 2015.